
    Object Oriented Tool for Modelling, Simulation and Production Planning in Petrochemical Industries

    Get PDF
    http://www.cs.hut.fi/~framling/Publications/RPO93.pdf
    The subject of this work is to study the use of object-oriented models of petrochemical plants in order to simplify simulation and production planning. A prototype system for an existing plant was developed to achieve this goal. The prototype was implemented in the object-oriented programming language Smalltalk/V. The system is generic and allows graphical simulation models of most kinds of petrochemical plants to be created mainly as a simple drawing operation. The models can be used for simulating production plans as well as for generating them automatically. The results obtained show that object-oriented methods allow the creation of very flexible models. These models are shown to facilitate simulation and automatic scheduling when used together with artificial intelligence methods.
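
    The core idea, plant models composed of connected unit objects that can then be simulated, can be illustrated with a minimal Python sketch. The original prototype was written in Smalltalk/V; the class and method names below are hypothetical, not those of the prototype.

        # Minimal sketch of an object-oriented plant model (hypothetical names;
        # the original prototype was written in Smalltalk/V).

        class ProcessUnit:
            """A plant unit (e.g. reactor, splitter) with downstream connections."""
            def __init__(self, name, capacity):
                self.name = name
                self.capacity = capacity      # max throughput per time step
                self.downstream = []          # units fed by this one

            def connect(self, unit):
                self.downstream.append(unit)

            def step(self, feed):
                """Process one time step; pass the processed flow downstream."""
                out = min(feed, self.capacity)
                for unit in self.downstream:
                    unit.step(out / max(len(self.downstream), 1))
                return out

        # Building a model is essentially a "drawing operation": create units,
        # connect them, then simulate a production plan step by step.
        crude = ProcessUnit("crude_feed", capacity=100.0)
        cracker = ProcessUnit("cracker", capacity=80.0)
        crude.connect(cracker)
        crude.step(feed=90.0)   # simulate one step of a production plan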

    Context, Utility and Influence of an Explanation

    Full text link
    Contextual utility theory integrates context-sensitive factors into utility-based decision-making models. It stresses the importance of understanding individual decision-makers' preferences, values, and beliefs, as well as the situational factors that affect them. Contextual utility theory benefits explainable AI in three ways. First, it can improve transparency and understanding of how AI systems affect decision-making: by considering personal preferences and context, it can reveal AI model biases and limitations. Second, it can make AI systems more personalized and adaptable to users and stakeholders; by incorporating demographic and cultural data, AI systems can better meet user needs and values. Finally, it promotes ethical AI development and social responsibility: by considering contextual factors such as societal norms and values, AI developers can create ethical systems that benefit society. This work demonstrates how contextual utility theory can improve AI system transparency, personalization, and ethics, benefiting both users and developers.
    Comment: 6 pages
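
    As a rough illustration of folding context-sensitive factors into a utility score, the sketch below weights a base utility by situational factors. The function, its weighting scheme, and all names are assumptions made for illustration; they are not the paper's formal model.

        # Illustrative contextual utility score: base utility adjusted by
        # context-dependent weights (hypothetical model, not the paper's).

        def contextual_utility(base_utility: float, context: dict[str, float],
                               weights: dict[str, float]) -> float:
            """Weight a base utility by situational factors in [0, 1]."""
            adjustment = sum(weights.get(k, 0.0) * v for k, v in context.items())
            return base_utility * (1.0 + adjustment)

        # A user's preferences (weights) and the current situation (context):
        weights = {"time_pressure": -0.4, "domain_expertise": 0.3}
        context = {"time_pressure": 0.8, "domain_expertise": 0.2}
        print(contextual_utility(0.7, context, weights))  # shrinks under time pressure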

    Addressing information flow in lean production management and control in construction

    Get PDF
    Traditionally, production control on construction sites has been a challenging area, where ad-hoc production control methods foster uncertainty, one of the biggest enemies of efficiency and smooth production flow. Lean construction methods such as the Last Planner System have partially tackled this problem by addressing the flow aspect through means such as constraints analysis and commitment planning. However, such systems have planning cycles too long to respond to the dynamic production requirements of construction, where almost daily, if not hourly, control is needed. Researchers have designed new solutions to improve this aspect, such as VisiLean, but these software systems require computer devices to be close at hand and available to workers. Given this observation, there is a need for a communication system between the field and the site office that is highly interoperable and provides real-time task status information. A high-level communication framework (using VisiLean) is presented in this paper, which aims to overcome the problems of system integration and improve the flow of information within the production system. The framework provides, among other things, generic and standardized interfaces to simplify the "push" and "pull" of the right (production) information, whenever needed, wherever needed, by whoever needs it. Overall, it is anticipated that the reliability of production control will be improved.
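
    A minimal sketch of what such a standardized push/pull interface could look like, assuming a simple in-memory task-status service; the names and interface shape are hypothetical, and VisiLean's actual interfaces are not reproduced here.

        # Minimal sketch of a push/pull production-information interface
        # (hypothetical; not VisiLean's actual API).

        from dataclasses import dataclass, field

        @dataclass
        class TaskStatusService:
            """Shared field/office view of task statuses."""
            statuses: dict = field(default_factory=dict)

            def push(self, task_id: str, status: str) -> None:
                """Field side: report real-time task status."""
                self.statuses[task_id] = status

            def pull(self, task_id: str) -> str:
                """Office side: query the latest known status."""
                return self.statuses.get(task_id, "unknown")

        service = TaskStatusService()
        service.push("pour-slab-3", "in_progress")   # from a worker's device
        print(service.pull("pour-slab-3"))           # from the site office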

    Do intermediate feature coalitions aid explainability of black-box models?

    Full text link
    This work introduces the notion of intermediate concepts, based on a levels structure, to aid explainability of black-box models. A levels structure is a hierarchy in which each level corresponds to a partition of the dataset's features (i.e., a partition of the player set). Coarseness increases from the trivial partition, which comprises only singletons, to the partition containing only the grand coalition. In addition, meronomies, i.e., part-whole relationships, can be established via a domain expert and utilised to generate explanations at an abstract level. We illustrate the usability of this approach on a real-world car model example and the Titanic dataset, where intermediate concepts aid explainability at different levels of abstraction.
    Comment: 14 pages, The 1st World Conference on eXplainable Artificial Intelligence, 2023
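
    The levels structure can be pictured as a chain of increasingly coarse partitions of the feature set, from singletons to the grand coalition. The sketch below encodes one such chain and checks that each level refines the next; the encoding and the Titanic-style feature names are illustrative assumptions.

        # A levels structure as a chain of partitions of the feature set,
        # from singletons up to the grand coalition (illustrative encoding).

        def is_coarsening(finer, coarser):
            """Every block of the finer partition must sit inside one coarser block."""
            return all(any(block <= big for big in coarser) for block in finer)

        features = {"age", "sex", "fare", "class"}
        levels = [
            [{"age"}, {"sex"}, {"fare"}, {"class"}],   # trivial partition: singletons
            [{"age", "sex"}, {"fare", "class"}],       # intermediate concepts
            [features],                                # grand coalition
        ]

        assert all(is_coarsening(levels[i], levels[i + 1])
                   for i in range(len(levels) - 1))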

    IoTEF: A Federated Edge-Cloud Architecture for Fault-Tolerant IoT Applications

    Get PDF
    The evolution of Internet of Things (IoT) technology has led to an increased emphasis on edge computing for Cyber-Physical Systems (CPS), in which applications rely on processing data closer to the data sources and sharing the results across heterogeneous clusters. This has simplified data exchange between IoT/CPS systems, the cloud, and the edge for managing low-latency, minimal-bandwidth, fault-tolerant applications. Nonetheless, many of these applications administer data collection on the edge and offer data analytics and storage capabilities in the cloud. This raises the problem of separate software stacks between the edge and the cloud with no unified fault-tolerant management, hindering dynamic relocation of data processing. In such systems, the data must also be protected from corruption or duplication in the case of intermittent long-distance network connectivity, malicious harming of edge devices, or other hostile environments. Within this context, the contributions of this paper are threefold: (i) to propose a new Internet of Things Edge-Cloud Federation (IoTEF) architecture for multi-cluster IoT applications by adapting our earlier Cloud and Edge Fault-Tolerant IoT (CEFIoT) layered design, addressing the fault-tolerance issue by employing the Apache Kafka publish/subscribe platform as the unified data replication solution and deploying Kubernetes for fault-tolerant management, combined with the federated scheme, offering a single management interface and allowing automatic reconfiguration of the data processing pipeline; (ii) to formulate functional and non-functional requirements of our proposed solution by comparing several IoT architectures; and (iii) to implement a smart-buildings use case of the ongoing Otaniemi3D project as a proof of concept for assessing IoTEF capabilities. The experimental results show that the architecture minimizes latency, saves network bandwidth, and handles both hardware failures and network connectivity failures.
    Peer reviewed
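
    The paper names Apache Kafka as the unified data replication solution; a minimal edge-side producer using the kafka-python client might look as follows. The broker address, topic name, and payload are placeholders, not values from the paper.

        # Edge-side producer publishing sensor data to a replicated Kafka topic
        # (sketch; broker/topic/payload are placeholders, not from the paper).

        import json
        from kafka import KafkaProducer  # pip install kafka-python

        producer = KafkaProducer(
            bootstrap_servers=["edge-broker:9092"],
            acks="all",        # wait for all in-sync replicas: tolerates broker loss
            retries=5,         # ride out intermittent connectivity
            value_serializer=lambda v: json.dumps(v).encode("utf-8"),
        )

        producer.send("building-sensors", {"room": "A113", "temp_c": 21.4})
        producer.flush()   # block until the write is acknowledged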

    Intelligent Products: Shifting the Production Control Logic in Construction (With Lean and BIM)

    Get PDF
    Production management and control in construction have hardly been updated since the introduction of the Critical Path Method and the Last Planner® System. The predominant outside-in control logic and a fragmented, deep supply chain significantly affect efficiency over a project's lifecycle. In a construction project, a large number of organisations interact with the product throughout the process, requiring a significant amount of information handling and synchronisation between these organisations. However, due to the deep supply chains and the lack of information integration, maintaining the flow of information across the lifecycle poses a significant challenge. This research proposes a product-centric system, where the control logic of the production process is embedded within the individual components from the design phase onwards. The solution is enabled by technologies and tools such as Building Information Modelling, the Internet of Things, and messaging systems, within the conceptual process framework of Lean Construction. The vision encompasses the lifecycle of projects from design to construction and maintenance, where the products can interact with the environment and its actors through the various stages, supporting a variety of actions. The vision, and the tools and technologies required to support it, are described in this paper.
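
    A product-centric control logic, where each component carries its own production state and notifies interested actors as it advances, might be sketched as below. The class, its lifecycle stages, and the callback mechanism are hypothetical; the BIM/IoT/messaging integration is elided.

        # Sketch of a product-centric component that owns its control logic
        # (hypothetical; BIM/IoT/messaging integration is elided).

        class IntelligentComponent:
            """A building component that tracks its own lifecycle state."""
            STAGES = ["designed", "fabricated", "delivered", "installed", "maintained"]

            def __init__(self, component_id: str):
                self.component_id = component_id
                self.stage = "designed"
                self.listeners = []          # actors subscribed to this component

            def subscribe(self, callback):
                self.listeners.append(callback)

            def advance(self):
                """Move to the next lifecycle stage and notify interested actors."""
                i = self.STAGES.index(self.stage)
                if i + 1 < len(self.STAGES):
                    self.stage = self.STAGES[i + 1]
                    for notify in self.listeners:
                        notify(self.component_id, self.stage)

        beam = IntelligentComponent("beam-042")
        beam.subscribe(lambda cid, stage: print(f"{cid} -> {stage}"))
        beam.advance()   # prints "beam-042 -> fabricated"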

    P2P Data Synchronization for Product Lifecycle Management

    Get PDF
    Intelligent products are an undeniable asset for efficient Product Lifecycle Management (PLM), providing ways to capture events related to physical objects at various locations and times. Today, more than ever before, PLM tools and systems must be built upon standards to enhance interoperability among all product stakeholders and to develop tools independent of specific vendors, applications, and operating systems. Based on this observation, this paper develops strategies to improve "information sustainability" in PLM environments using the standardized communication interfaces defined by a recent standard proposal, the Quantum Lifecycle Management (QLM) messaging standards. More concretely, data synchronization models based upon the QLM standards are developed to enable the synchronization of product-related information among the various systems, networks, and organizations involved throughout the product lifecycle. Our proposals are implemented and assessed on two distinct platforms in the healthcare and home automation sectors.
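
    As a rough picture of synchronizing product records across organizations, the sketch below merges two peers' views with a last-writer-wins rule. This generic scheme is an assumption for illustration; it is not the QLM message format itself.

        # Last-writer-wins merge of product records between two peers
        # (generic illustration; not the QLM messaging format itself).

        def sync(local: dict, remote: dict) -> dict:
            """Merge per-product records, keeping the newest (timestamp, value)."""
            merged = dict(local)
            for product_id, (ts, value) in remote.items():
                if product_id not in merged or ts > merged[product_id][0]:
                    merged[product_id] = (ts, value)
            return merged

        hospital = {"pump-7": (1700000000, "in service")}
        home_hub = {"pump-7": (1700003600, "maintenance due"),
                    "sensor-2": (1700001000, "active")}

        # After an exchange, both peers converge on the same view:
        print(sync(hospital, home_hub))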